First-Order Methods for Convex Optimization

Authors

Abstract

First-order methods for solving convex optimization problems have been at the forefront of mathematical optimization in the last 20 years. The rapid development of this important class of algorithms is motivated by the success stories reported in various applications, including most importantly machine learning, signal processing, imaging and control theory. First-order methods have the potential to provide low-accuracy solutions at low computational complexity, which makes them an attractive set of tools for large-scale optimization problems. In this survey, we cover a number of key developments in gradient-based optimization methods. This includes non-Euclidean extensions of the classical proximal gradient method and its accelerated versions. Additionally we survey recent developments within projection-free methods and proximal versions of primal-dual schemes. We give complete proofs for various key results and highlight the unifying aspects of several algorithms.
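The proximal gradient method mentioned in the abstract alternates a gradient step on the smooth part of the objective with a proximal step on the nonsmooth part. As a minimal illustration (not taken from the survey), the following sketch solves the lasso problem 0.5·‖Ax − b‖² + λ‖x‖₁, whose proximal operator is the well-known soft-thresholding map; the function names and the fixed step size 1/L are this example's own choices:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, steps=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient.

    Uses the constant step size 1/L, where L = ||A||_2^2 is the
    Lipschitz constant of the gradient of the smooth term.
    """
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)                   # gradient of smooth part
        x = soft_threshold(x - grad / L, lam / L)  # proximal (shrinkage) step
    return x
```

With A = I the iteration reduces to plain soft-thresholding of b, which makes the fixed point easy to check by hand.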


Similar Papers

First-order Methods for Geodesically Convex Optimization

Geodesic convexity generalizes the notion of (vector space) convexity to nonlinear metric spaces. But unlike convex optimization, geodesically convex (g-convex) optimization is much less developed. In this paper we contribute to the understanding of g-convex optimization by developing iteration complexity analysis for several first-order algorithms on Hadamard manifolds. Specifically, we prove ...


Fast First-Order Methods for Composite Convex Optimization with Backtracking

We propose new versions of accelerated first-order methods for convex composite optimization, where the prox parameter is allowed to increase from one iteration to the next. In particular we show that a full backtracking strategy can be used within the FISTA [1] and FALM algorithms [7] while preserving their worst-case iteration complexities of O(√(L(f)/ε)). In the original versions of FISTA an...
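For reference, the standard FISTA scheme combines the proximal gradient step above with Nesterov momentum, and the classical backtracking variant (from the FISTA paper [1]) only ever increases the Lipschitz estimate L; the abstract's contribution is precisely to also let it decrease. The sketch below shows the standard increasing-only backtracking, with all names chosen for this example:

```python
import numpy as np

def fista_backtracking(f, grad_f, prox_g, x0, L0=1.0, eta=2.0, iters=100):
    """FISTA with standard backtracking on the Lipschitz estimate L.

    f, grad_f: smooth part and its gradient; prox_g(v, s) is the
    proximal operator of s * g.  L only increases here (classical rule).
    """
    x, y, t, L = x0.copy(), x0.copy(), 1.0, L0
    for _ in range(iters):
        # Backtrack: grow L until the quadratic upper bound holds at z.
        while True:
            g = grad_f(y)
            z = prox_g(y - g / L, 1.0 / L)
            if f(z) <= f(y) + g @ (z - y) + 0.5 * L * np.sum((z - y) ** 2):
                break
            L *= eta
        # Nesterov momentum update.
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = z + ((t - 1) / t_new) * (z - x)
        x, t = z, t_new
    return x
```

Allowing L to shrink again between iterations (the paper's "full backtracking") matters because an early, overly large L otherwise forces small steps for the rest of the run.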


Proximal and First-Order Methods for Convex Optimization

We describe the proximal method for minimization of convex functions. We review classical results, recent extensions, and interpretations of the proximal method that work in online and stochastic optimization settings.
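The proximal method described in this abstract iterates x_{k+1} = prox_{λf}(x_k), i.e. each step minimizes f plus a quadratic penalty on the distance to the current iterate. A minimal sketch (not from the paper; the scalar example f(x) = |x|, whose prox is soft-thresholding, is this illustration's own choice):

```python
def proximal_point(prox_f, x0, lam=1.0, iters=50):
    """Proximal point method: x_{k+1} = prox_{lam * f}(x_k)."""
    x = x0
    for _ in range(iters):
        x = prox_f(x, lam)
    return x

def prox_abs(v, t):
    """Prox of t * |.| in one dimension: scalar soft-thresholding."""
    return max(v - t, 0.0) if v > 0 else min(v + t, 0.0)
```

Starting from x0 = 10 with λ = 1, each step moves one unit toward the minimizer 0 and then stays there, which shows the method's fixed points are exactly the minimizers of f.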


Avoiding Synchronization in First-Order Methods for Sparse Convex Optimization

Parallel computing has played an important role in speeding up convex optimization methods for big data analytics and large-scale machine learning (ML). However, the scalability of these optimization methods is inhibited by the cost of communicating and synchronizing processors in a parallel setting. Iterative ML methods are particularly sensitive to communication cost since they often require ...


First-order methods of smooth convex optimization with inexact oracle

In this paper, we analyze different first-order methods of smooth convex optimization employing inexact first-order information. We introduce the notion of an approximate first-order oracle. The list of examples of such an oracle includes smoothing technique, Moreau-Yosida regularization, Modified Lagrangians, and many others. For different methods, we derive complexity estimates and study the ...
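A crude way to see the effect of an inexact oracle, not taken from the paper (which uses a more refined (δ, L)-oracle definition), is gradient descent where each gradient query is corrupted by bounded noise: the iterates then converge only to a neighborhood of the minimizer whose radius scales with the noise level. All names below are this sketch's own:

```python
import numpy as np

def gd_inexact(grad, x0, step, iters, noise=0.0, seed=0):
    """Gradient descent with an inexact oracle: each call returns the
    true gradient plus uniformly bounded noise of magnitude `noise`."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        g = grad(x) + noise * rng.uniform(-1.0, 1.0, size=x.shape)
        x = x - step * g
    return x
```

With zero noise the iterates on a quadratic contract geometrically to the minimizer; with noise they stall inside a ball proportional to step × noise, which mirrors the accuracy floors derived in inexact-oracle complexity estimates.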



Journal

Journal: EURO Journal on Computational Optimization

Year: 2021

ISSN: 2192-4406, 2192-4414

DOI: https://doi.org/10.1016/j.ejco.2021.100015